Overcomplete Image Representations for Texture Analysis

Authors

Abstract


Similar Articles

Learning Overcomplete Representations

In an overcomplete basis, the number of basis vectors is greater than the dimensionality of the input, and the representation of an input is not a unique combination of basis vectors. Overcomplete representations have been advocated because they have greater robustness in the presence of noise, can be sparser, and can have greater flexibility in matching structure in the data. Overcomplete code...
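
As a concrete illustration of the non-uniqueness and sparsity mentioned above, the following Python sketch (a toy example of my own, not code from the cited paper) codes a 2-D signal in an overcomplete dictionary of six unit-norm atoms and uses ISTA (iterative soft thresholding) to pick a sparse coefficient vector among the many possible representations:

import numpy as np

rng = np.random.default_rng(0)
D = rng.normal(size=(2, 6))                  # overcomplete dictionary: 6 atoms for a 2-D input
D /= np.linalg.norm(D, axis=0)               # unit-norm basis vectors
x = np.array([1.0, -0.5])                    # input signal

lam = 0.05                                   # L1 weight: larger -> sparser code
step = 1.0 / np.linalg.norm(D, 2) ** 2       # step size from the Lipschitz constant of D^T D
a = np.zeros(6)
for _ in range(500):
    z = a - step * (D.T @ (D @ a - x))       # gradient step on ||x - D a||^2 / 2
    a = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)   # soft threshold -> sparsity

print("nonzero coefficients:", int(np.count_nonzero(np.abs(a) > 1e-8)), "of", a.size)
print("reconstruction error:", float(np.linalg.norm(D @ a - x)))

Any least-squares solver would also reproduce x here, but typically with all six coefficients active; the L1 penalty is what selects a sparse combination among the equivalent ones.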


Image De-Quantizing via Enforcing Sparseness in Overcomplete Representations

We describe a method for removing quantization artifacts (de-quantizing) in the image domain, by enforcing a high degree of sparseness in its representation with an overcomplete oriented pyramid. For this purpose we devise a linear operator that returns the minimum L2-norm image preserving a set of significant coefficients, and estimate the original by minimizing the cardinality of that subset,...
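
A rough reading of the idea, with a plain DCT standing in for the overcomplete oriented pyramid and a simple alternating-projection loop in place of the paper's cardinality minimization (an illustrative stand-in, not the authors' algorithm):

import numpy as np
from scipy.fft import dctn, idctn

def dequantize(q_img, delta, keep=0.10, iters=30):
    # The true pixel values must lie inside the quantization bin of each observation.
    lo, hi = q_img - delta / 2.0, q_img + delta / 2.0
    x = q_img.astype(float).copy()
    for _ in range(iters):
        c = dctn(x, norm="ortho")
        thr = np.quantile(np.abs(c), 1.0 - keep)   # keep only the most significant coefficients
        c[np.abs(c) < thr] = 0.0                   # enforce a high degree of sparseness
        x = idctn(c, norm="ortho")
        x = np.clip(x, lo, hi)                     # project back onto the quantization constraint
    return x

rng = np.random.default_rng(1)
img = rng.normal(size=(64, 64)).cumsum(axis=0).cumsum(axis=1)   # smooth synthetic image
delta = 8.0
q = np.round(img / delta) * delta                               # quantized observation
est = dequantize(q, delta)
print("mean abs error, quantized:", float(np.abs(q - img).mean()),
      "de-quantized:", float(np.abs(est - img).mean()))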


Energy-Based Models for Sparse Overcomplete Representations

We present a new way of extending independent components analysis (ICA) to overcomplete representations. In contrast to the causal generative extensions of ICA which maintain marginal independence of sources, we define features as deterministic (linear) functions of the inputs. This assumption results in marginal dependencies among the features, but conditional independence of the features give...
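
Read that way, the model assigns an energy that sums sparsity penalties over linear filter responses; the toy snippet below writes that setup down directly (my paraphrase of the construction, not the authors' implementation):

import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 3))            # 8 linear feature filters on a 3-D input: overcomplete

def energy(x):
    # Features are deterministic linear functions of the input; the energy
    # penalizes them with a sparsity-inducing magnitude term.
    return float(np.sum(np.abs(W @ x)))

def unnormalized_density(x):
    # p(x) is proportional to exp(-E(x)); the partition function is left implicit.
    return float(np.exp(-energy(x)))

x = rng.normal(size=3)
print("energy:", energy(x), "unnormalized density:", unnormalized_density(x))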


Sparse Overcomplete Word Vector Representations

Current distributed representations of words show little resemblance to theories of lexical semantics. The former are dense and uninterpretable, the latter largely based on familiar, discrete classes (e.g., supersenses) and relations (e.g., synonymy and hypernymy). We propose methods that transform word vectors into sparse (and optionally binary) vectors. The resulting representations are more ...
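
A small sketch of that kind of transformation using scikit-learn's dictionary learning (stand-in vectors and hyperparameters of my own choosing, not the paper's released tooling):

import numpy as np
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(0)
dense = rng.normal(size=(300, 50))       # 300 toy "word vectors", 50-dimensional and dense

coder = DictionaryLearning(
    n_components=100,                    # overcomplete: 100 output dimensions for 50-d inputs
    alpha=1.0,                           # L1 penalty controlling sparsity
    max_iter=20,
    fit_algorithm="cd",
    transform_algorithm="lasso_cd",
    positive_code=True,                  # optional non-negativity, in the spirit of the binary variant
    random_state=0,
)
sparse_vectors = coder.fit_transform(dense)
print("average nonzeros per word:", float(np.count_nonzero(sparse_vectors, axis=1).mean()))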


Learning Nonlinear Overcomplete Representations for Efficient Coding

We derive a learning algorithm for inferring an overcomplete basis by viewing it as a probabilistic model of the observed data. Overcomplete bases allow for better approximation of the underlying statistical density. Using a Laplacian prior on the basis coefficients removes redundancy and leads to representations that are sparse and are a nonlinear function of the data. This can be viewed as a ge...
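
The step from the Laplacian prior to sparseness can be made explicit: with a linear generative model and that prior, MAP inference of the coefficients is an L1-penalized least-squares problem (notation mine, not necessarily the paper's):

\[
x = \Phi a + n, \qquad p(a_i) \propto e^{-\theta |a_i|}, \qquad
\hat{a}(x) = \arg\min_a \; \frac{1}{2\sigma^2}\,\lVert x - \Phi a \rVert_2^2 + \theta \sum_i |a_i| .
\]

Because no fixed linear map solves this minimization for every x, the inferred code \hat{a}(x) varies nonlinearly with the input, which is the sense in which the representation is a nonlinear function of the data.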



Journal

Journal title: ELCVIA Electronic Letters on Computer Vision and Image Analysis

Year: 2014

ISSN: 1577-5097

DOI: 10.5565/rev/elcvia.586